Search results for "60J05"

Showing 6 of 6 documents

Coupled conditional backward sampling particle filter

2020

The conditional particle filter (CPF) is a promising algorithm for general hidden Markov model smoothing. Empirical evidence suggests that the variant of the CPF with backward sampling (CBPF) performs well even with long time series. Previous theoretical results have not been able to demonstrate the improvement brought by backward sampling, whereas we provide rates showing that the CBPF can remain effective with a fixed number of particles independent of the time horizon. Our result is based on the analysis of a new coupling of two CBPFs, the coupled conditional backward sampling particle filter (CCBPF). We show that the CCBPF has good stability properties in the sense that, with a fixed number of particles, …
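To make the algorithm concrete, here is a minimal sketch of one CPF sweep with backward sampling for a simple linear-Gaussian state-space model (x_t = rho*x_{t-1} + noise, y_t = x_t + noise). This is an illustrative toy implementation, not the paper's code; the model, parameter names, and pinning of the reference trajectory to the last particle slot are all assumptions for the example.

```python
import numpy as np

def cbpf_step(y, x_ref, n_particles=100, rho=0.9, sig_x=1.0, sig_y=1.0, rng=None):
    """One conditional-particle-filter sweep with backward sampling (CBPF),
    sketched for the toy model x_t = rho*x_{t-1} + N(0, sig_x^2),
    y_t = x_t + N(0, sig_y^2). Maps a reference trajectory x_ref to a new one."""
    rng = rng or np.random.default_rng()
    T, N = len(y), n_particles
    X = np.empty((T, N))
    logW = np.empty((T, N))
    # Forward pass: the last particle slot is pinned to the reference trajectory.
    X[0] = rng.normal(0.0, sig_x, N)
    X[0, -1] = x_ref[0]
    logW[0] = -0.5 * (y[0] - X[0]) ** 2 / sig_y**2
    for t in range(1, T):
        w = np.exp(logW[t - 1] - logW[t - 1].max())
        anc = rng.choice(N, size=N, p=w / w.sum())       # multinomial resampling
        X[t] = rho * X[t - 1, anc] + rng.normal(0.0, sig_x, N)
        X[t, -1] = x_ref[t]                               # keep the reference particle
        logW[t] = -0.5 * (y[t] - X[t]) ** 2 / sig_y**2
    # Backward sampling: re-draw ancestors, which breaks path degeneracy.
    traj = np.empty(T)
    w = np.exp(logW[-1] - logW[-1].max())
    j = rng.choice(N, p=w / w.sum())
    traj[-1] = X[-1, j]
    for t in range(T - 2, -1, -1):
        logb = logW[t] - 0.5 * (traj[t + 1] - rho * X[t]) ** 2 / sig_x**2
        b = np.exp(logb - logb.max())
        j = rng.choice(N, p=b / b.sum())
        traj[t] = X[t, j]
    return traj
```

Iterating `x_ref = cbpf_step(y, x_ref)` yields a Markov chain on trajectories whose invariant distribution is the smoothing distribution; the abstract's result concerns how well this chain mixes as the time horizon grows.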

Subjects: conditional particle filter; backward sampling; coupling; hidden Markov model; particle filter; smoothing; sampling (statistics); rate of convergence; time horizon; stability (probability); unbiasedness; Markov chains; stochastic processes; Monte Carlo methods; numerical analysis; Statistics - Computation (stat.CO); Probability (math.PR)
MSC: 65C05 (primary); 60J05, 65C35, 65C40, 65C60 (secondary)

Variable length Markov chains and dynamical sources

2010

Infinite random sequences of letters can be viewed as stochastic chains or as strings produced by a source, in the sense of information theory. The relationship between Variable Length Markov Chains (VLMC) and probabilistic dynamical sources is studied. We establish a probabilistic frame for context trees and VLMC, and we prove that any VLMC is a dynamical source for which we explicitly build the mapping. On two examples, the "comb" and the "bamboo blossom", we find a necessary and sufficient condition for the existence and uniqueness of a stationary probability measure for the VLMC. These two examples are detailed in order to provide the associated Dirichlet series as well as the gener…
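The defining feature of a VLMC is that the next letter depends on a variable-length suffix (a "context") of the past. A minimal sketch, assuming a hypothetical `context_tree` that maps context strings to next-letter probabilities (with `""` as the root/default context):

```python
import random

def vlmc_sample(context_tree, alphabet, n, seed=None):
    """Draw n letters from a Variable Length Markov Chain.
    context_tree maps a context (a suffix of the past) to a dict of
    next-letter probabilities; the longest matching suffix is used."""
    rng = random.Random(seed)
    past = ""
    out = []
    for _ in range(n):
        # Find the longest suffix of the past that is a context in the tree.
        ctx = ""
        for k in range(len(past), -1, -1):
            if past[len(past) - k:] in context_tree:
                ctx = past[len(past) - k:]
                break
        probs = context_tree[ctx]
        letter = rng.choices(alphabet, weights=[probs[a] for a in alphabet])[0]
        out.append(letter)
        past += letter
    return "".join(out)
```

For example, `vlmc_sample({"": {"a": 0.5, "b": 0.5}, "a": {"a": 0.9, "b": 0.1}}, "ab", 100)` makes an `a` strongly self-reinforcing while a `b` resets to the root distribution; the paper's "comb" and "bamboo blossom" examples are infinite context trees of this kind.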

Subjects: variable length Markov chains; probabilistic dynamical sources; dynamical systems of the interval; occurrences of words; Dirichlet series; Probability (math.PR); Dynamical Systems (math.DS)
MSC: 60J05; 37E05

Establishing some order amongst exact approximations of MCMCs

2016

Exact approximations of Markov chain Monte Carlo (MCMC) algorithms are a general emerging class of sampling algorithms. One of the main ideas behind exact approximations consists of replacing intractable quantities required to run standard MCMC algorithms, such as the target probability density in a Metropolis-Hastings algorithm, with estimators. Perhaps surprisingly, such approximations lead to powerful algorithms which are exact in the sense that they are guaranteed to have correct limiting distributions. In this paper we discover a general framework which allows one to compare, or order, performance measures of two implementations of such algorithms. In particular, we establish an order …
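The replace-density-with-estimator idea is easiest to see in a pseudo-marginal Metropolis-Hastings step. A generic sketch, not tied to any particular model in the paper; `log_pi_hat` stands for a user-supplied noisy but unbiased log-density estimate (a hypothetical name):

```python
import numpy as np

def pseudo_marginal_mh(log_pi_hat, x0, n_iters, step=0.5, seed=0):
    """Pseudo-marginal Metropolis-Hastings with a Gaussian random-walk
    proposal: the intractable log target is replaced by an estimate
    log_pi_hat(x, rng), yet the chain keeps the exact limiting distribution."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_pi_hat(x0, rng)
    chain = [x]
    for _ in range(n_iters):
        x_prop = x + step * rng.normal()
        lp_prop = log_pi_hat(x_prop, rng)   # fresh estimate at the proposal
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = x_prop, lp_prop         # the accepted estimate is recycled,
        chain.append(x)                     # never refreshed, which is what
    return np.array(chain)                  # makes the algorithm exact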

Subjects: pseudo-marginal algorithm; Markov chain Monte Carlo; Gibbs sampling; martingale coupling; convex order; order condition; asymptotic variance; estimator; delta method; monotonic function; mathematical optimization; Markov chain; martingale (probability theory); algorithms; Statistics - Computation (stat.CO); Probability (math.PR)
MSC: 65C40 (primary); 60J05, 65C05 (secondary); 60J22; 60E15

Uniform ergodicity of the iterated conditional SMC and geometric ergodicity of particle Gibbs samplers

2018

We establish quantitative bounds for rates of convergence and asymptotic variances for iterated conditional sequential Monte Carlo (i-cSMC) Markov chains and associated particle Gibbs samplers. Our main finding is that the essential boundedness of potential functions associated with the i-cSMC algorithm provides necessary and sufficient conditions for the uniform ergodicity of the i-cSMC Markov chain, as well as quantitative bounds on its (uniformly geometric) rate of convergence. Furthermore, we show that in many applications of interest the i-cSMC Markov chain cannot even be geometrically ergodic if this essential boundedness does not hold. Our sufficiency and quantitative bounds rely on…
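A particle Gibbs sampler wraps the i-cSMC kernel in an outer loop that alternates a parameter update with a trajectory refresh. A minimal structural sketch; `sample_params` and `csmc_step` are hypothetical user-supplied functions (a conditional SMC sweep such as the CPF fits the latter role):

```python
def particle_gibbs(y, theta0, x0, sample_params, csmc_step, n_iters, rng):
    """Particle Gibbs: alternate (i) drawing parameters given the current
    trajectory and (ii) refreshing the trajectory with one conditional SMC
    sweep that keeps the previous trajectory as its reference."""
    theta, x = theta0, x0
    samples = []
    for _ in range(n_iters):
        theta = sample_params(y, x, rng)   # theta | y, x
        x = csmc_step(y, theta, x, rng)    # i-cSMC kernel, reference = x
        samples.append((theta, x))
    return samples
```

The ergodicity results in the abstract concern exactly the `csmc_step` component: if the potentials are essentially bounded, this inner kernel, and hence the whole sampler, converges at a uniform geometric rate.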

Subjects: iterated conditional sequential Monte Carlo; particle Gibbs; particle filter; Markov chain Monte Carlo; Gibbs sampling; Gibbs measure; uniform ergodicity; geometric ergodicity; ergodic theory; rate of convergence; Metropolis-within-Gibbs; combinatorics; Probability (math.PR)
MSC: 65C40 (primary); 60J05, 65C05 (secondary)

Central limit theorem for bifurcating Markov chains

2020

Bifurcating Markov chains (BMC) are Markov chains indexed by a full binary tree representing the evolution of a trait along a population where each individual has two children. We first provide a central limit theorem for general additive functionals of BMC, and prove the existence of three regimes. This corresponds to a competition between the reproducing rate (each individual has two children) and the ergodicity rate for the evolution of the trait. This is in contrast with the work of Guyon (2007), where the considered additive functionals are sums of martingale increments, and only one regime appears. Our first result can be seen as a discrete time version, but with general trait evoluti…
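The object is easy to simulate: start from a root trait and let every individual produce two children whose traits are drawn from a transition kernel. An illustrative sketch (children drawn independently here, for simplicity, and the AR(1)-style kernel is an assumption for the example, not the paper's model):

```python
import numpy as np

def simulate_bmc(n_generations, kernel, x_root=0.0, seed=0):
    """Simulate a bifurcating Markov chain on a full binary tree:
    each individual with trait x has two children whose traits are
    drawn from kernel(x, rng). Returns one array of traits per generation."""
    rng = np.random.default_rng(seed)
    gens = [np.array([x_root])]
    for _ in range(n_generations):
        parents = gens[-1]
        children = np.array([kernel(x, rng) for x in parents for _ in range(2)])
        gens.append(children)
    return gens

def ar1_kernel(x, rng, a=0.5, sigma=1.0):
    """Ergodic AR(1)-type trait evolution (illustrative only)."""
    return a * x + sigma * rng.normal()
```

An additive functional in the abstract's sense is, e.g., the empirical mean of f(x) = x over the 2^n individuals of generation n; the three regimes describe how its fluctuations scale as the reproduction rate (2 per generation) competes with the ergodicity rate of the kernel.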

Subjects: bifurcating Markov chains; fluctuations for tree-indexed Markov chains; binary trees; bifurcating autoregressive process; density estimation; Probability (math.PR); Statistics Theory (stat.TH)
MSC (2020): 60J05; 60J80; 60F05; 62G05; 62F12

Central limit theorem for bifurcating Markov chains under L2-ergodic conditions

2021

Bifurcating Markov chains (BMC) are Markov chains indexed by a full binary tree representing the evolution of a trait along a population where each individual has two children. We provide a central limit theorem for additive functionals of BMC under L2-ergodic conditions, with three different regimes. This completes the pointwise approach developed in a previous work. As an application, we study the elementary case of the symmetric bifurcating autoregressive process, which justifies the non-trivial hypotheses placed on the transition kernel of the BMC. We illustrate in this example the phase transition observed in the fluctuations.

Subjects: bifurcating Markov chains; fluctuations for tree-indexed Markov chains; binary trees; bifurcating autoregressive process; density estimation; Probability (math.PR)
MSC (2020): 60J05; 60J80; 60F05